Iterative Algorithms for Graphical Models

Author

  • Robert Mateescu
Abstract

Probabilistic inference in Bayesian networks, and even reasoning within error bounds, are known to be NP-hard problems. Our research focuses on investigating approximate message-passing algorithms inspired by Pearl’s belief propagation algorithm and by variable elimination. We study the advantages of bounded inference provided by anytime schemes such as Mini-Clustering (MC), and combine them with the virtues of iterative algorithms such as Iterative Belief Propagation (IBP). Our resulting hybrid algorithm, Iterative Join-Graph Propagation (IJGP), is shown empirically to surpass the performance of both MC and IBP on several classes of networks. IJGP can also be viewed as a Generalized Belief Propagation algorithm, a framework which recently allowed connections with approximate algorithms from statistical physics, showing that convergence points are in fact stationary points of the Bethe (or the more general Kikuchi) free energy. Although there is still little understanding of why or when IBP works well, it exhibits tremendous performance on different classes of problems, most notably coding and satisfiability problems. We investigate iterative algorithms for Bayesian networks by making connections with well-known constraint processing algorithms, which help explain when IBP correctly infers extreme beliefs. This study gives an account of why iterating helps, and identifies classes of easy and hard problems for IBP (and IJGP). Finally, we plan to investigate iterative message-passing algorithms in other graph-based frameworks such as influence diagrams and planning.
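
To make the message-passing style concrete, the sketch below implements iterative ("loopy") belief propagation on a simple pairwise model in Python with NumPy. It only illustrates the kind of propagation that IBP and IJGP perform; it is not the Mini-Clustering or IJGP implementation from this work, and the pairwise-factor representation, function names, and convergence test are assumptions made for the example.

```python
# Minimal sketch of iterative (loopy) belief propagation on a pairwise model.
# Illustrative only: not the MC/IJGP algorithms described in the abstract.
import numpy as np

def loopy_bp(unaries, pair_factors, n_iters=50, tol=1e-6):
    """unaries: {i: array of shape (k_i,)};
    pair_factors: {(i, j): array of shape (k_i, k_j)}."""
    # One message per directed edge, initialised uniformly.
    msgs = {}
    for (i, j), f in pair_factors.items():
        msgs[(i, j)] = np.ones(f.shape[1]) / f.shape[1]
        msgs[(j, i)] = np.ones(f.shape[0]) / f.shape[0]

    for _ in range(n_iters):
        delta = 0.0
        for (i, j), f in pair_factors.items():
            for src, dst, table in ((i, j, f), (j, i, f.T)):
                # Product of src's unary and all incoming messages except the one from dst.
                prod = unaries[src].astype(float)
                for (a, b), m in msgs.items():
                    if b == src and a != dst:
                        prod = prod * m
                new = table.T @ prod          # marginalise out src
                new /= new.sum()              # normalise
                delta = max(delta, np.abs(new - msgs[(src, dst)]).max())
                msgs[(src, dst)] = new
        if delta < tol:                       # convergence is not guaranteed on loopy graphs
            break

    # Beliefs: unary times all incoming messages, normalised.
    beliefs = {}
    for i, u in unaries.items():
        b = u.astype(float)
        for (a, j), m in msgs.items():
            if j == i:
                b = b * m
        beliefs[i] = b / b.sum()
    return beliefs
```

On tree-structured models a single sweep of these updates is exact; on graphs with cycles the same updates are simply iterated until the messages stop changing, which is what makes the algorithm "iterative".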

Similar articles

IPF for Discrete Chain Factor Graphs

Iterative Proportional Fitting (IPF), combined with EM, is commonly used as an algorithm for likelihood maximization in undirected graphical models. In this paper, we present two iterative algorithms that generalize upon IPF. The first one is for likelihood maximization in discrete chain factor graphs, which we define as a wide class of discrete variable models including undirected grap...
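
As a point of reference for the classical algorithm this entry generalizes, here is a rough sketch of textbook Iterative Proportional Fitting on a single discrete table: each step rescales the joint so that one of its marginals matches a given target. It is not the chain-factor-graph algorithm proposed in the paper; the function name and the dictionary of target marginals are illustrative assumptions.

```python
# Minimal sketch of classical IPF on one discrete joint table.
# Illustrative only: not the generalization proposed in the cited paper.
import numpy as np

def ipf(joint, target_marginals, n_iters=100, tol=1e-9):
    """joint: ndarray over all variables; target_marginals: {axis: 1-D array}."""
    p = joint / joint.sum()
    for _ in range(n_iters):
        max_gap = 0.0
        for axis, target in target_marginals.items():
            # Current marginal over this axis.
            other_axes = tuple(a for a in range(p.ndim) if a != axis)
            current = p.sum(axis=other_axes)
            max_gap = max(max_gap, np.abs(current - target).max())
            # Multiplicative update: rescale slices so this marginal matches the target.
            ratio = np.divide(target, current,
                              out=np.zeros_like(target, dtype=float),
                              where=current > 0)
            shape = [1] * p.ndim
            shape[axis] = -1
            p = p * ratio.reshape(shape)
        if max_gap < tol:
            break
    return p
```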

Pushing the Power of Stochastic Greedy Ordering Schemes for Inference in Graphical Models

We study iterative randomized greedy algorithms for generating (elimination) orderings with small induced width and state space size, two parameters known to bound the complexity of inference in graphical models. We propose and implement the Iterative Greedy Variable Ordering (IGVO) algorithm, a new variant within this algorithm class. An empirical evaluation using different ranking functions an...
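
For intuition, the sketch below runs one randomized greedy elimination pass using the min-fill heuristic and reports the induced width of the resulting ordering; schemes such as IGVO repeat passes of this kind with different ranking functions and random tie-breaking, keeping the best ordering found. This is not the authors' implementation; the min-fill cost and the adjacency-dictionary representation are assumptions made for the example.

```python
# Rough sketch of one randomized min-fill elimination pass.
# Illustrative only: not the IGVO implementation from the cited paper.
import random

def random_min_fill_ordering(adj):
    """adj: {var: set of neighbours} for the model's primal (moral) graph."""
    adj = {v: set(nb) for v, nb in adj.items()}   # work on a copy
    ordering, width = [], 0
    while adj:
        def fill(v):
            # Number of edges that eliminating v would add among its neighbours.
            nbs = list(adj[v])
            return sum(1 for i in range(len(nbs)) for j in range(i + 1, len(nbs))
                       if nbs[j] not in adj[nbs[i]])
        best = min(fill(v) for v in adj)
        # Randomized tie-breaking among minimum-fill candidates.
        v = random.choice([u for u in adj if fill(u) == best])
        width = max(width, len(adj[v]))           # induced width along this ordering
        for a in adj[v]:                          # connect v's neighbours, then remove v
            adj[a] |= (adj[v] - {a})
            adj[a].discard(v)
        del adj[v]
        ordering.append(v)
    return ordering, width
```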

A Low Complexity Circuit Architecture for Rapid PN Code Acquisition in UWB Systems Using Iterative Message Passing on Redundant Graphical Models

Rapidly acquiring the code phase of the spreading sequence in an ultra-wideband system is a very difficult problem. In this paper, we present a new iterative algorithm and its hardware architecture in detail. Our algorithm is based on running iterative message passing algorithms on a standard graphical model augmented with multiple redundant models. Simulation results show that our new algorith...

Thesis Proposal Parallel Learning and Inference in Probabilistic Graphical Models

Probabilistic graphical models are one of the most influential and widely used techniques in machine learning. Powered by exponential gains in processor technology, graphical models have been successfully applied to a wide range of increasingly large and complex real-world problems. However, recent developments in computer architecture, large-scale computing, and data-storage have shifted the f...

Iterative Decoding of Compound Codes by Probability Propagation in Graphical Models

We present a unified graphical model framework for describing compound codes and deriving iterative decoding algorithms. After reviewing a variety of graphical models (Markov random fields, Tanner graphs, and Bayesian networks), we derive a general distributed marginalization algorithm for functions described by factor graphs. From this general algorithm, Pearl’s belief propagation algorithm is...

WOLFE: An NLP-friendly Declarative Machine Learning Stack

Developing machine learning algorithms for natural language processing (NLP) applications is inherently an iterative process, involving a continuous refinement of the choice of model, engineering of features, selection of inference algorithms, search for the right hyperparameters, and error analysis. Existing probabilistic program languages (PPLs) only provide partial solutions; most of them do...

Publication date: 2003